Project Pandora – cloud rendering in 3DS Max

  • Project Pandora – cloud rendering in 3DS Max

    never say never

    http://www.maxunderground.com/archives/15172_project_pandora___cloud_rendering_in_3ds_max.html?utm_source=dlvr.it&utm_medium=twitter
    Bobby Parker
    www.bobby-parker.com
    e-mail: info@bobby-parker.com
    phone: 2188206812

    My current hardware setup:
    • Ryzen 9 5900x CPU
    • 128gb Vengeance RGB Pro RAM
    • NVIDIA GeForce RTX 4090 X2
    • Windows 11 Pro

  • #2
    The question is, does it work with a real renderer?
    Marc Lorenz
    ___ ___ ___ ___ ___ ___ ___ ___ ___ ___ ___ ___
    www.marclorenz.com
    www.facebook.com/marclorenzvisualization



    • #3
      I haven't had a chance to play with Pandora, but our limited experience with the Amazon cloud and GPU rendering was not really all that great. The GPU nodes are quite expensive and only available in some zones of the Amazon cluster and you pay for all data transferred to/from the cloud as well. Transferring a not-so-large 40 MB scene (most of it textures) to the cloud took longer than it took to render it locally.

      On the other hand, I know people who render on the cloud with V-Ray, so obviously in some cases it has an advantage.
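
      Vlado's point about transfer time can be put into a quick back-of-envelope check. The sketch below is illustrative only: the uplink speed, render times, and the break-even logic are assumptions, not measurements from any actual cloud setup.

      ```python
      # Rough check of whether uploading a scene to the cloud pays off:
      # upload time plus remote render time vs. just rendering locally.
      # All numbers below are illustrative assumptions.

      def upload_seconds(scene_mb, uplink_mbps):
          """Time to push a scene to the cloud over a given uplink."""
          return scene_mb * 8 / uplink_mbps  # MB -> megabits, then / (megabits/s)

      def cloud_worth_it(scene_mb, uplink_mbps, local_render_s, cloud_render_s):
          """Cloud only wins if upload + remote render beats the local render."""
          return upload_seconds(scene_mb, uplink_mbps) + cloud_render_s < local_render_s

      # A 40 MB scene on a 2 Mbit/s uplink takes 160 s just to upload:
      print(upload_seconds(40, 2))           # 160.0
      print(cloud_worth_it(40, 2, 120, 30))  # False: upload alone exceeds the local render
      ```

      The break-even flips only once the uplink is fast enough (or the scene is reused across many frames, so the upload cost is amortized), which matches the experience described above.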

      Best regards,
      Vlado
      I only act like I know everything, Rogers.



      • #4
        I can't help but think that GPU rendering is a dead end, at least for arch viz.
        If you look at the direction arch viz is taking, you see a huge increase in polycount. A few years ago it was all about GI; now it's all about ultra-realistic 3D vegetation. I can't imagine fitting a scene into 2 GB anymore.
        I can certainly imagine filling 32 GB of RAM easily. When will there be GPUs with 32 GB?
        My textures haven't really changed, though; I tend to use the same basic textures for every project, plus maybe some custom ones.
        If there were online storage in the cloud for main assets and textures, like Dropbox, all I would need would be maybe an additional 100 MB per project.
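
        The polycount point is easy to sanity-check with rough arithmetic. The triangle count and per-vertex size below are illustrative assumptions, just to show why a vegetation-heavy scene blows past a 2 GB card:

        ```python
        # Quick estimate of why a heavy-vegetation arch-viz scene no longer
        # fits in a 2 GB GPU. Figures are illustrative assumptions.

        def mesh_bytes(triangles):
            # ~3 vertices per triangle; position + normal + UV is roughly
            # 32 bytes per vertex. Index reuse is ignored to keep the
            # estimate simple (which makes it pessimistic, if anything).
            return triangles * 3 * 32

        # 50 million triangles of trees and grass:
        geometry_gb = mesh_bytes(50_000_000) / 1024**3
        print(round(geometry_gb, 1))  # ~4.5 GB of raw geometry, before any textures
        ```

        Even with generous instancing, adding displacement and texture memory on top makes the 2 GB ceiling look very tight.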
        Marc Lorenz
        ___ ___ ___ ___ ___ ___ ___ ___ ___ ___ ___ ___
        www.marclorenz.com
        www.facebook.com/marclorenzvisualization



        • #5
          Originally posted by plastic_ View Post
          If you look at the direction arch viz is taking,
          I fully agree - we're rendering 5K-wide stills that need all textures, displacement and geometry to be pin-sharp.
          8 GB is a bit too low for that, and 12 GB still needs some optimising. Still miles away from any video card's memory.
          Last edited by Neilg; 29-11-2011, 03:42 AM.



          • #6
            Originally posted by plastic_ View Post
            A few years ago it was all about GI, now it's all about ultra realistic 3d vegetation.
            That's a very true statement. I'd add super-realistic furniture too, modeled down to the last screw and crease. Some of the best illustrators around are maniacally devoted to accuracy and detail, and even those who aren't obsessive about modeling will still need a lot of memory for grass, trees, and huge HDRs.
            It's hard to imagine anyone wanting to take a step back to the kind of scenes you had to work hard to fit into 32-bit systems.



            • #7
              Maybe the next round of video cards will help.
              "Some of the technologies that Nvidia promised to introduce in Kepler and Maxwell (the architecture that will succeed Kepler) include virtual memory space (which will allow CPUs and GPUs to use the "unified" virtual memory)"

              http://www.xbitlabs.com/news/mobile/...ectX_11_1.html



              • #8
                The memory space might be unified, but the data will still have to travel from the main memory to the GPU memory through the PCI bus. Maybe if the GPU was integrated on the motherboard it would be easier to share the memory with the CPU...
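
                To see roughly what that bus traffic costs, here is a small sketch. The bandwidth figure is an assumed theoretical number for a PCIe 2.0 x16 slot (2011-era hardware); real sustained rates would be lower:

                ```python
                # Even with unified virtual memory, out-of-core data still has
                # to cross the PCIe bus. A rough look at what that costs.
                # Bandwidth figure is an illustrative theoretical assumption.

                PCIE2_X16_GBPS = 8.0  # ~8 GB/s theoretical for PCIe 2.0 x16

                def bus_transfer_seconds(data_gb, bus_gbps=PCIE2_X16_GBPS):
                    """Time to stream a working set across the PCIe bus once."""
                    return data_gb / bus_gbps

                # Streaming a 32 GB scene across the bus once takes ~4 s:
                print(bus_transfer_seconds(32))  # 4.0
                ```

                A few seconds per full pass is tolerable; re-fetching data every time the renderer misses the on-board memory is where a serious speed hit would come from.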

                Best regards,
                Vlado
                I only act like I know everything, Rogers.



                • #9
                  So do you think this would allow rendering much bigger scenes on the GPU, and instead of crashing it would just take a speed hit?



                  • #10
                    Originally posted by chriserskine View Post
                    So do you think this would allow rendering much bigger scenes on the GPU, and instead of crashing it would just take a speed hit?
                    I guess that's what we all hope for. But we'll see how it turns out.

                    Best regards,
                    Vlado
                    I only act like I know everything, Rogers.
